Nonparametric time dependent principal components analysis.
Authors
Abstract
Related papers
Nonparametric Principal Components Regression
In ordinary least squares regression, dimensionality is a sensitive issue. As the number of independent variables approaches the sample size, the least squares algorithm can easily fail: estimates are not unique, or are very unstable (Draper and Smith, 1981). Several problems are usually encountered in modeling high-dimensional data, including the difficulty of visualizing the data, s...
Persian Handwriting Analysis Using Functional Principal Components
Principal components analysis is a well-known statistical method for dealing with large dependent data sets. It is also used with functional data, both for data reduction and for representing variation. On the other hand, "handwriting" is an object studied in various statistical fields such as pattern recognition and shape analysis. Considering time as the argument,...
"Spaghetti" PCA analysis: An extension of principal components analysis to time dependent interval data
In this paper we present an extension of Principal Component Analysis to analyse time dependent interval data. In our approach, each observation is characterized by an oriented interval of values, with a starting and an ending value for each period of observation: for example, the open and close price of a share in a stock market for a day or a week, or the initial expression value and fin...
Online Principal Components Analysis
We consider the online version of the well-known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of d-dimensional vectors X = [x_1, …, x_n] and a target dimension k < d; the output is a set of k-dimensional vectors Y = [y_1, …, y_n] that minimize the reconstruction error: min_Φ Σ_i ‖x_i − Φ y_i‖². Here, Φ ∈ R^{d×k} is restricted to being isometric. The...
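The objective in the abstract above can be checked numerically. The following is a minimal sketch (not from the paper) of the offline optimum: for an isometric Φ, the best y_i is Φᵀx_i, and choosing Φ as the top-k eigenvectors of XXᵀ makes the reconstruction error equal to the sum of the discarded eigenvalues. All variable names are illustrative.

```python
import numpy as np

# Illustrative data: d-dimensional vectors as columns of X.
rng = np.random.default_rng(1)
d, n, k = 6, 50, 2
X = rng.normal(size=(d, n))

# Offline optimum: Phi = top-k eigenvectors of X X^T (isometric: Phi^T Phi = I).
eigvals, eigvecs = np.linalg.eigh(X @ X.T)          # ascending eigenvalues
Phi = eigvecs[:, np.argsort(eigvals)[::-1][:k]]

Y = Phi.T @ X                                       # k-dimensional outputs y_i
err = np.sum((X - Phi @ Y) ** 2)                    # sum_i ||x_i - Phi y_i||^2

# The minimal error equals the sum of the d-k smallest eigenvalues of X X^T.
print(np.isclose(err, np.sort(eigvals)[: d - k].sum()))
```

The online version studied in the paper is harder precisely because Φ (or each y_i) must be produced before all of X has been seen; the sketch above only computes the offline benchmark.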
Principal Components Analysis
Derivation of PCA I: For a set of d-dimensional data vectors {x_i}, the principal axes {e_j} (j = 1, …, q) are those orthonormal axes onto which the retained variance under projection is maximal. It can be shown that the vectors e_j are given by the q dominant eigenvectors of the sample covariance matrix S, such that S e_j = λ_j e_j. The q principal components of the observed vector x_i are given by the vector ...
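The derivation above translates directly into a few lines of linear algebra. Here is a minimal sketch using NumPy; the data matrix X and the choice q = 2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative data: n = 100 observations of d = 5 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Sample covariance matrix S of the centred data.
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / (Xc.shape[0] - 1)

# Eigendecomposition S e_j = lambda_j e_j; keep the q dominant eigenvectors.
q = 2
eigvals, eigvecs = np.linalg.eigh(S)      # returned in ascending order
order = np.argsort(eigvals)[::-1]         # sort descending by retained variance
E = eigvecs[:, order[:q]]                 # principal axes e_1, ..., e_q

# Principal components of each observation x_i: projections onto the axes.
Y = Xc @ E
print(Y.shape)  # (100, 2)
```

`np.linalg.eigh` is used rather than `np.linalg.eig` because S is symmetric, which guarantees real eigenvalues and orthonormal eigenvectors.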
Journal
Journal title: ANZIAM Journal
Year: 2003
ISSN: 1445-8810
DOI: 10.21914/anziamj.v44i0.699